Absorbing Markov chain definition


In the mathematical theory of probability, an absorbing Markov chain is a Markov chain in which every state can reach an absorbing state. An absorbing state is a state that, once entered, cannot be left. As with general Markov chains, there can be continuous-time absorbing Markov chains with an infinite state space. However, this article concentrates...
Found on http://en.wikipedia.org/wiki/Absorbing_Markov_chain
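To make the definition concrete, here is a minimal sketch (an assumed example, not taken from the source): a symmetric random walk on the states {0, 1, 2, 3, 4}, where 0 and 4 are absorbing and 1-3 are transient. Writing the transition matrix in block form with Q (transient-to-transient) and R (transient-to-absorbing), the fundamental matrix N = (I - Q)^{-1} yields expected absorption times t = N·1 and absorption probabilities B = N·R.

```python
import numpy as np

# Q: transitions among the transient states 1, 2, 3 (move left/right with prob 1/2)
Q = np.array([[0.0, 0.5, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 0.5, 0.0]])
# R: transitions from transient states 1, 2, 3 to absorbing states 0 and 4
R = np.array([[0.5, 0.0],
              [0.0, 0.0],
              [0.0, 0.5]])

# Fundamental matrix: entry (i, j) is the expected number of visits to
# transient state j when starting from transient state i.
N = np.linalg.inv(np.eye(3) - Q)

t = N @ np.ones(3)   # expected number of steps until absorption
B = N @ R            # absorption probabilities into states 0 and 4

print(t)     # [3. 4. 3.]
print(B[1])  # [0.5 0.5] -- from the middle state, each end is equally likely
```

Because every transient state can reach an absorbing state, I - Q is invertible and absorption happens with probability 1, which the rows of B (each summing to 1) confirm.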